Prioritizing data privacy in advertising: A must for companies
Posted: October 30, 2024
In a significant move, LinkedIn has been fined €310 million (about $336 million) by the Irish Data Protection Commission (IDPC) for privacy violations related to its ad tracking business. Similarly, Pinterest is now under scrutiny as noyb accuses it of violating GDPR by not obtaining explicit consent from users for tracking and profiling. Currently, Pinterest tracks all of its 136 million European users by default for ‘ads personalization,’ requiring users to manually opt out if they do not wish to be tracked.
“Unsurprisingly, Pinterest’s business model is also based on personalized advertising and the associated user tracking. The problem: Despite a CJEU ruling prohibiting this practice, the platform uses people’s personal data without asking for their consent. Pinterest falsely claims to have a “legitimate interest” and enables tracking by default. Most other websites have abandoned this legally flawed argument years ago.” – noyb
The privacy group noyb references a landmark 2023 ruling by the Court of Justice of the European Union (CJEU), which struck down Meta’s use of legitimate interest as a legal basis for targeted advertising. This decision sets a precedent, suggesting that Pinterest and similar platforms must secure explicit user consent for personalized advertising.
Implications for the digital advertising industry
The cases of LinkedIn and Pinterest highlight the growing challenges tech companies face in navigating data privacy regulations. These incidents serve as a wake-up call for the digital advertising industry, which heavily relies on user-level tracking for ad targeting and campaign measurement.
Our Privacy Beyond Borders report reveals that 92% of consumers believe companies prioritize profits over data protection. This perception persists in the realm of personalized ads. Despite the push for change, some experts predict that the industry will continue to resist transitioning to more transparent, consent-based tracking methods. Default tracking has long been the norm and remains highly profitable.
Personalized content keeps users engaged on platforms and drives purchases, raising questions about the boundaries of “legitimate interest.” Until regulators clearly define this concept, companies will continue to exploit its ambiguity.
A recent Forbes survey shows that 81% of customers prefer companies that offer a personalized experience, so where do companies draw the line between personalization and privacy? How much granular control do companies put into the hands of consumers, and how much is assumed under “legitimate interest”? While a personalized experience is ideal, consumers are more aware than ever of what data companies are taking from them, with or without consent.
The need for real change
For tech companies with substantial financial resources, fines for privacy violations are often seen as a cost of doing business, with little deterrent effect. Therefore, rulings that mandate substantial changes in business practices are more likely to drive meaningful impact. Joe Jones from IAPP notes that while monetary fines are significant, enforcement orders requiring changes in business practices, such as halting personalized advertising, will have a more profound effect on the digital advertising ecosystem.
Regulatory pressure from various angles can also drive significant changes in how organizations collect, process, use, and store consumer data. Dr. Rob van Eijk from the Future of Privacy Forum points to the US Federal Trade Commission’s use of consent decrees as an example.
Moving forward
Amid the complex regulatory landscape, the Digital Markets Act, in force since 2022, shares similar requirements with GDPR, interpreting consent and personal data in the same way. This overlap may help streamline compliance for some platforms.
Experts anticipate more aggressive regulatory enforcement of privacy laws globally by 2025, increasing the stakes for publishers, platforms, advertisers, and data brokers.
To avoid potentially steep fines and remedies, organizations “should aim to empower consumers by giving them more control over their own data,” says Cassie’s David McInerney. There are a variety of means to do so, he explains. “This can look like implementing user-friendly interfaces that allow consumers to manage their data preferences, enabling features like consent management tools and providing clear opt-in and opt-out options. All of these methods show consumers that their trust is worthy of protection.”
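To make the opt-in approach concrete, here is a minimal Python sketch of how a consent-management layer might gate tracking behind explicit, withdrawable consent rather than a tracking-by-default model. All names (`ConsentPreferences`, `may_track`, the purpose fields) are hypothetical illustrations, not any platform’s actual API.

```python
from dataclasses import dataclass

@dataclass
class ConsentPreferences:
    """Hypothetical per-user consent record.

    Every purpose defaults to False: the user is NOT tracked
    unless they explicitly opt in (consent-based, not default-on).
    """
    ads_personalization: bool = False
    analytics: bool = False

    def grant(self, purpose: str) -> None:
        # Explicit opt-in for a named purpose
        if purpose not in self.__dataclass_fields__:
            raise ValueError(f"unknown purpose: {purpose}")
        setattr(self, purpose, True)

    def withdraw(self, purpose: str) -> None:
        # Withdrawal must be as easy as granting consent
        if purpose not in self.__dataclass_fields__:
            raise ValueError(f"unknown purpose: {purpose}")
        setattr(self, purpose, False)


def may_track(prefs: ConsentPreferences) -> bool:
    # Tracking is permitted only while explicit consent is on record
    return prefs.ads_personalization


prefs = ConsentPreferences()
print(may_track(prefs))            # no tracking by default
prefs.grant("ads_personalization")
print(may_track(prefs))            # tracking allowed after opt-in
prefs.withdraw("ads_personalization")
print(may_track(prefs))            # tracking stops on withdrawal
```

The key design choice, mirroring the consent model described above, is that the default state denies tracking: the code checks for an affirmative grant rather than the absence of an opt-out.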
By prioritizing data privacy, companies can not only comply with regulations but also foster stronger, trust-based relationships with their users.